41.
The paper studies the effect of magnitude errors on heterogeneous catalogs by applying the apparent magnitude theory (see Tinti and Mulargia, 1985a), which proves to be the most natural and rigorous approach to the problem. Heterogeneities in seismic catalogs arise from a number of sources and affect instrumental as well as noninstrumental earthquake compilations. The most frequent cause of heterogeneity is certainly that recent instrumental records must be combined with historic and prehistoric event listings to secure a time coverage considerably longer than the recurrence time of the major earthquakes. Therefore the case that attracts the greatest attention in the present analysis is that of a catalog consisting of a subset of higher-quality data, generally S1, spanning the interval T1 (the instrumental catalog), and a second subset with more uncertain magnitude determinations, generally S2, covering a vastly longer interval T2 (the historic and/or geologic catalog). The magnitude threshold of the subcatalog S1 is supposedly smaller than that of S2, which, as we will see, is one of the major causes of discrepancy between the apparent-magnitude and the true-magnitude distributions. We further suppose that true magnitude occurrences conform to the Gutenberg-Richter (GR) law, because this assumption simplifies the analysis without reducing the relevance of our findings. The main results are: 1) the apparent occurrence rate exceeds the true occurrence rate from a certain magnitude onward, say m_GR; 2) the apparent occurrence rate shows two distinct GR regimes separated by an intermediate transition region. The offset between the two regimes is the essential outcome of S1 being heterogeneous with respect to S2.
The most important consequences of this study are that: 1) it provides a basis for inferring the parameters of the true magnitude distribution by correcting the bias deriving from heterogeneous magnitude errors; 2) it demonstrates that the double GR decay, which several authors have taken as incontestable proof of the failure of the GR law and as experimental evidence for the characteristic-earthquake theory, is instead perfectly consistent with GR-type seismicity.
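A minimal numerical sketch of the first result (not the paper's derivation): if true magnitudes follow a GR exponential decay and apparent magnitudes add symmetric Gaussian noise, more events are scattered upward than downward across any threshold, so apparent rates exceed true rates. The b-value, completeness level, and error size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
b = 1.0                      # assumed GR b-value
beta = b * np.log(10.0)      # GR decay rate in natural-log units
n = 200_000

# true magnitudes above an assumed completeness level m_c = 0
m_true = rng.exponential(1.0 / beta, size=n)

# apparent magnitudes: true magnitude plus Gaussian measurement error
sigma = 0.3                  # assumed magnitude uncertainty
m_app = m_true + rng.normal(0.0, sigma, size=n)

# cumulative counts above a test magnitude well inside the catalog range
true_rate = int(np.sum(m_true >= 2.0))
app_rate = int(np.sum(m_app >= 2.0))
# symmetric noise on an exponentially decaying population inflates the
# count above any fixed magnitude by roughly exp(beta**2 * sigma**2 / 2)
```

With two error levels (one per subcatalog S1, S2) the same mechanism produces the offset between the two apparent GR regimes described above.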
42.
This paper analyzes in detail the errors introduced when analog acceleration records are digitized with digitizing tablets and laser scanners, presents methods for removing these errors, and describes the processing software developed for this purpose. Digitization error is the superposition of the systematic error of the digitizing equipment and the random error introduced by the operator during reading; the random digitization error is an ergodic, stationary random process whose amplitude follows a Gaussian distribution. Digitizing strong-motion records with a laser scanner is highly efficient. Examples of the laser-scanner processing software and of digitization-noise removal are given.
43.
We present a methodology able to infer the influence of rainfall measurement errors on the reliability of extreme rainfall statistics. We especially focus on systematic mechanical errors affecting the most popular rain-intensity measurement instrument, namely the tipping-bucket rain gauge (TBR). Such uncertainty strongly depends on the measured rainfall intensity (RI), with systematic underestimation of high RIs, leading to a biased estimation of extreme rain-rate statistics. Furthermore, since intense rain rates are usually recorded over short intervals in time, any possible correction strongly depends on the time resolution of the recorded data sets. We propose a simple procedure for the correction of low-resolution data series after disaggregation at a suitable scale, so that assessing the influence of systematic errors on rainfall statistics becomes possible. The disaggregation procedure is applied to a 40-year rain-depth dataset recorded at hourly resolution by using the IRP (Iterated Random Pulse) algorithm. A set of extreme statistics commonly used in urban hydrology practice has been extracted from the simulated data and compared with those obtained after direct correction of a 12-year high-resolution (1 min) RI series. In particular, the depth–duration–frequency curves derived from the original and corrected data sets have been compared in order to quantify the impact of non-corrected rain-intensity measurements on design rainfall and the related statistical parameters. Preliminary results suggest that the IRP model, owing to its skill in reproducing extreme rainfall intensities at fine time resolution, is well suited to supporting rainfall-intensity correction techniques.
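The abstract does not reproduce the correction formula itself; a common form in the TBR-calibration literature is a power-law calibration of the measured intensity, sketched below with purely illustrative coefficients (not the authors' calibration).

```python
def correct_tbr(intensity_mm_h, a=1.10, b=1.05):
    """Power-law calibration I_ref = a * I_meas**b for tipping-bucket rain
    intensities (mm/h); coefficients a, b here are illustrative only."""
    return [a * i ** b if i > 0 else 0.0 for i in intensity_mm_h]

# a short 1-min intensity series: the relative correction grows with rate,
# consistent with the systematic underestimation of high RIs noted above
series = [0.0, 4.0, 60.0, 120.0, 2.0]
corrected = correct_tbr(series)
```

Because the exponent exceeds 1, high intensities are raised proportionally more than low ones, which is why the correction matters most for extreme-value statistics.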
44.
45.
46.
Bayesian data fusion in a spatial prediction context: a general formulation
In spite of the exponential growth in the amount of data that one may expect to provide greater modeling and prediction opportunities, the number and diversity of sources over which this information is fragmented is growing at an even faster rate. As a consequence, there is a real need for methods that aim at reconciling them inside an epistemically sound theoretical framework. In a statistical spatial prediction framework, classical methods are based on a multivariate approach to the problem, at the price of strong modeling hypotheses. Though new avenues have recently been opened by focusing on the integration of uncertain data sources, to the best of our knowledge there have been no systematic attempts to explicitly account for information redundancy through a data fusion procedure. Starting from the simple concept of measurement errors, this paper proposes a Bayesian approach for integrating the processing of multiple information sources as part of the prediction process itself. A general formulation is first proposed for deriving the prediction distribution of a continuous variable of interest at unsampled locations using more or less uncertain (soft) information at neighboring locations. The case of multiple information sources is then considered, with a Bayesian solution to the problem of fusing multiple pieces of information that are provided as separate conditional probability distributions. Well-known methods and results are derived as limit cases. The convenient hypothesis of conditional independence is discussed in the light of information theory and the maximum entropy principle, and a methodology is suggested for the optimal selection of the most informative subset of information, if needed. Based on a synthetic case study, an application of the methodology is presented and discussed.
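A minimal sketch of the conditional-independence fusion step for the Gaussian limit case (a standard special case, not the paper's general formulation): with a flat prior, independent Gaussian soft data about the same value fuse by summing precisions.

```python
def fuse_gaussians(means, variances):
    """Fuse conditionally independent Gaussian soft data N(mu_i, s2_i)
    about one unknown: with a flat prior, the posterior precision is the
    sum of the data precisions and the mean is precision-weighted."""
    precisions = [1.0 / v for v in variances]
    post_prec = sum(precisions)
    post_mean = sum(m * p for m, p in zip(means, precisions)) / post_prec
    return post_mean, 1.0 / post_prec

mean, var = fuse_gaussians([10.0, 12.0], [1.0, 4.0])
# (10/1 + 12/4) / (1 + 1/4) = 10.4, posterior variance 1 / 1.25 = 0.8
```

Note that the fused variance (0.8) is smaller than either input variance, which is exactly the redundancy effect the conditional-independence hypothesis can overstate when the sources are in fact correlated.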
47.
More on the principle, implementation, and applications of the quasi-accurate detection (QUAD) method
The principle, characteristics, and research approach of the quasi-accurate detection (QUAD) method are comprehensively described. The key to QUAD is the correct selection of the quasi-accurate observations, and the main points of their preliminary and secondary selection are introduced. Application examples are given, including gross-error detection in the case of correlated observations, anomaly detection in deformation analysis, and cycle-slip detection and repair in GPS carrier-phase observations.
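QUAD itself estimates true observation errors from a rank-deficient system built on the selected quasi-accurate observations; as a drastically simplified illustration of the selection idea only (not the QUAD algorithm), one can pick the observations with the smallest least-squares residuals, refit on them, and flag the remainder:

```python
import numpy as np

def screen_by_residuals(A, y, keep_ratio=0.7, k=3.0):
    """Toy two-step screen inspired by quasi-accurate selection (NOT the
    QUAD algorithm): fit least squares, keep the observations with the
    smallest residuals as "quasi-accurate", refit on them, and flag any
    observation whose residual against the refit exceeds k * sigma."""
    x0, *_ = np.linalg.lstsq(A, y, rcond=None)
    r0 = np.abs(y - A @ x0)
    keep = np.sort(np.argsort(r0)[: int(round(keep_ratio * len(y)))])
    x1, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
    r1 = y - A @ x1
    sigma = r1[keep].std()
    return x1, np.where(np.abs(r1) > k * sigma)[0]
```

The point of the quasi-accurate idea is that the second fit is driven only by observations believed to be clean, so a gross error no longer distorts the estimates used to detect it.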
48.
Research on schemes for the combined processing of VLBI, SLR, and GPS data
徐天河  杨元喜 《测绘工程》2002,11(4):7-10,21
Schemes for the combined processing of VLBI, SLR, and GPS data are classified according to the types of observations included in the adjustment. The adjustment model for each class of scheme is given and, in light of the current observational status of the three techniques in China, the advantages and disadvantages of each scheme are analyzed and a practicable combined-processing scheme is proposed.
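One standard way to realize such a combined adjustment, when the techniques share a common parameter set, is to stack their normal equations; the sketch below is illustrative of that general technique, not of the authors' specific models.

```python
import numpy as np

def combine_normals(systems):
    """Stack normal equations from independent techniques observing a
    common parameter vector x: N = sum(A.T P A), b = sum(A.T P y),
    then solve N x = b."""
    n = systems[0][0].shape[1]
    N, b = np.zeros((n, n)), np.zeros(n)
    for A, P, y in systems:      # design matrix, weight matrix, observations
        N += A.T @ P @ A
        b += A.T @ P @ y
    return np.linalg.solve(N, b)

# two "techniques" measuring the same scalar with different weights
x = combine_normals([
    (np.array([[1.0]]), np.array([[1.0]]), np.array([10.0])),
    (np.array([[1.0]]), np.array([[4.0]]), np.array([12.0])),
])
# reduces to the weighted mean (1*10 + 4*12) / 5 = 11.6
```

Combining at the normal-equation level avoids exchanging raw observations between analysis centers while still weighting each technique by its precision.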
49.
杨克明  陈秀凤  张守峰  林建 《气象》2001,27(4):35-37
Using ECMWF 24-, 48-, and 72-hour gridded 850 hPa wind-field forecasts together with observed wind fields, the forecast skill for the 850 hPa wind field and for the rain-producing synoptic systems at 500 hPa and 850 hPa is verified for the 13 flood-producing heavy-rain and extremely-heavy-rain events that occurred over the Yangtze River basin during June-August 1998. The results support the interpretation of numerical forecast products and their correction in real-time operational forecasting, so as to improve heavy-rain forecasting skill.
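The verification statistics are not specified in the abstract; one elementary score for gridded wind forecasts is the vector-wind RMSE, shown here as a hypothetical stand-in for the checks actually performed.

```python
import math

def wind_rmse(fu, fv, ou, ov):
    """RMSE of the vector wind error over matching grid points:
    sqrt(mean((u_f - u_o)**2 + (v_f - v_o)**2))."""
    sq = [(a - c) ** 2 + (b - d) ** 2 for a, b, c, d in zip(fu, fv, ou, ov)]
    return math.sqrt(sum(sq) / len(sq))

# a 3-4-5 vector error at one grid point and a perfect second point
err = wind_rmse([3.0, 1.0], [4.0, 2.0], [0.0, 1.0], [0.0, 2.0])
# sqrt((25 + 0) / 2) = sqrt(12.5)
```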
50.
李莉  赵俊英  颜宏  黄丽萍 《气象》2001,27(3):3-7
A new method for reducing the accumulation of nesting errors in limited-area climate models, three-dimensional nesting, is introduced, and the motivation, necessity, and theoretical basis of the method are described in detail. Comparative experiments between three-dimensional and two-dimensional nesting in a limited-area climate model show that the three-dimensional scheme reduces the accumulation of nesting forecast errors and also considerably improves precipitation forecasts.

Copyright©北京勤云科技发展有限公司  京ICP备09084417号